# Abstractive summarization generation
## Dansumt5 Large
Danish news summarization model based on the mT5-large architecture and trained on the DaNewsroom dataset, designed for generating abstractive summaries of Danish news articles.

- **Author:** Danish-summarisation
- **License:** Apache-2.0
- **Task:** Text Generation (Transformers · Other)
- **Stats:** 20 · 3
## Mt5 Small HunSum 1
Hungarian abstractive summarization model based on the mT5-small architecture and trained on the HunSum-1 dataset.

- **Author:** SZTAKI-HLT
- **Task:** Text Generation (Transformers · Other)
- **Stats:** 14 · 1
## Mt5 Base Arabic
A summarization model fine-tuned from google/mt5-base on the Arabic portion of the XLSum dataset.

- **Author:** eslamxm
- **License:** Apache-2.0
- **Task:** Text Generation (Transformers · Arabic)
- **Stats:** 24 · 0
## Pegasus Newsroom
PEGASUS is an abstractive summarization model pre-trained with gap-sentence generation, developed by Google Research and focused on producing high-quality text summaries.

- **Author:** google
- **Task:** Text Generation (Transformers · English)
- **Stats:** 52 · 16
## Distill Pegasus Xsum 16 4
A distilled PEGASUS checkpoint; PEGASUS is an abstractive summarization model pre-trained with gap-sentence generation, developed by the Google Research team.

- **Author:** sshleifer
- **Task:** Text Generation (Transformers · English)
- **Stats:** 137 · 4
## It5 Summarization Fanpage
An Italian abstractive summarization model fine-tuned from gsarti/it5-base on the Fanpage dataset.

- **Author:** ARTeLab
- **Task:** Text Generation (Transformers · Other)
- **Stats:** 16 · 2
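All of the checkpoints above are served through the Transformers library, so they share one usage pattern. The sketch below shows it, assuming the `transformers` package is installed; the default model ID and the 400-word chunk size are illustrative assumptions, not values taken from this page.

```python
# Minimal sketch: running a summarization checkpoint like those listed
# above via the Hugging Face `transformers` pipeline API.
# Assumptions: the default model ID and the 400-word chunk size are
# illustrative choices, not values from this page.

def chunk_text(text: str, max_words: int = 400) -> list[str]:
    """Split a long article into word-bounded chunks.

    Model input limits are measured in tokens, not words, so a
    conservative word count is used as a rough proxy.
    """
    words = text.split()
    return [" ".join(words[i:i + max_words])
            for i in range(0, len(words), max_words)]

def summarize(article: str,
              model_id: str = "google/pegasus-newsroom",
              max_length: int = 128) -> str:
    """Summarize each chunk of an article and join the partial summaries."""
    # Imported lazily so chunk_text stays usable without transformers installed.
    from transformers import pipeline
    summarizer = pipeline("summarization", model=model_id)
    parts = [summarizer(c, max_length=max_length, truncation=True)[0]["summary_text"]
             for c in chunk_text(article)]
    return " ".join(parts)
```

Swapping `model_id` for any checkpoint on this page (for example an mT5-based one for non-English news) is the only change needed; the pipeline call is the same.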